Introduction to Continuous Entropy

Author

  • Charles Marsh
Abstract

Classically, Shannon entropy was formalized over discrete probability distributions. However, the concept of entropy can be extended to continuous distributions through a quantity known as continuous (or differential) entropy. The most common definition of continuous entropy is seemingly straightforward; however, further analysis reveals a number of shortcomings that render it far less useful than it appears. Instead, relative entropy (or KL divergence) proves to be the key to information theory in the continuous case, as the notion of comparing entropy across probability distributions retains its value. Building on this notion, we present several results in the field of maximum entropy and, in particular, conclude with an information-theoretic proof of the Central Limit Theorem using continuous relative entropy.
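For concreteness, the two quantities discussed above are usually written as follows; this is a brief sketch in standard notation, and the density symbols f and g are ours rather than taken from the paper:

\[
  h(X) \;=\; -\int_{-\infty}^{\infty} f(x)\,\log f(x)\,dx,
  \qquad
  D(f \,\|\, g) \;=\; \int_{-\infty}^{\infty} f(x)\,\log\frac{f(x)}{g(x)}\,dx \;\ge\; 0.
\]

Unlike its discrete counterpart, h(X) can be negative and changes under a rescaling of x, whereas D(f ∥ g) is non-negative and invariant under invertible changes of variables, which is one reason relative entropy remains meaningful in the continuous case.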


Related articles

Entropy operator for continuous dynamical systems of finite topological entropy

In this paper we introduce the concept of an entropy operator for continuous systems of finite topological entropy. It is shown that it generates the Kolmogorov entropy as a special case. If $\phi$ is invertible, then the entropy operator is bounded, with the topological entropy of $\phi$ as its norm.


A cross entropy algorithm for continuous covering location problem

The covering problem seeks to locate the least number of facilities such that each demand point has at least one facility within a specified distance. This paper considers a cross-entropy algorithm for solving the mixed-integer nonlinear programming (MINLP) formulation of the covering location model. The model is solved to determine the best covering value. The paper also proposes a Cross Entropy (CE) algorithm consider...


Entropy Estimate for Maps on Forests

A 1993 result of J. Llibre and M. Misiurewicz (Theorem A [5]) states that if a continuous map f of a graph into itself has an s-horseshoe, then the topological entropy of f is greater than or equal to log s, that is, h(f) ≥ log s. Also, a 1980 result of L.S. Block, J. Guckenheimer, M. Misiurewicz and L.S. Young (Lemma 1.5 [3]) states that if G is an A-graph of f then h(G) ≥ h(f). In this pap...


Modeling of the Maximum Entropy Problem as an Optimal Control Problem and its Application to Pdf Estimation of Electricity Price

In this paper, continuous optimal control theory is used to model and solve the maximum entropy problem for a continuous random variable. The maximum entropy principle provides a method for obtaining a least-biased probability density function (pdf) estimate. In this paper, to find a closed-form solution for the maximum entropy problem with any number of moment constraints, the entropy is consi...
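As a rough sketch of the underlying optimization, in our notation rather than necessarily the paper's, the maximum entropy problem with m moment constraints reads

\[
  \max_{f}\; -\int f(x)\,\ln f(x)\,dx
  \quad\text{subject to}\quad
  \int f(x)\,dx = 1,
  \qquad
  \int g_i(x)\,f(x)\,dx = \mu_i,\quad i = 1,\dots,m,
\]

and the stationarity conditions (for instance via Lagrange multipliers) yield a solution of exponential-family form,

\[
  f^{*}(x) \;=\; \exp\!\Big(\lambda_0 + \sum_{i=1}^{m} \lambda_i\, g_i(x)\Big),
\]

with the multipliers chosen so that the constraints are satisfied.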


Cross-Entropy Method

The cross-entropy method is a recent versatile Monte Carlo technique. This article provides a brief introduction to the cross-entropy method and discusses how it can be used for rare-event probability estimation and for solving combinatorial, continuous, constrained and noisy optimization problems. A comprehensive list of references on cross-entropy methods and applications is included.
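To make the optimization use case concrete, below is a minimal sketch of one way a cross-entropy iteration can be implemented for a continuous problem; the Gaussian sampling model, the toy objective and all parameter values are illustrative assumptions rather than anything specified in the article.

import numpy as np

def objective(x):
    # Toy objective to minimize: squared distance from the point (3, 3).
    return np.sum((x - 3.0) ** 2, axis=-1)

def cross_entropy_minimize(dim=2, n_samples=100, n_elite=10, n_iters=50, seed=0):
    rng = np.random.default_rng(seed)
    mu = np.zeros(dim)           # mean of the Gaussian sampling distribution
    sigma = np.full(dim, 5.0)    # per-coordinate standard deviation
    for _ in range(n_iters):
        samples = rng.normal(mu, sigma, size=(n_samples, dim))
        scores = objective(samples)
        elite = samples[np.argsort(scores)[:n_elite]]  # best candidates this round
        # Cross-entropy update: refit the sampling distribution to the elite set.
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-8
    return mu

print(cross_entropy_minimize())  # converges near [3.0, 3.0]

The core cross-entropy idea is visible in the update step: at each iteration the sampling distribution is refit to the best-performing samples, concentrating future samples around promising regions.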




Publication date: 2013